Semantic Relation Classification by Bi-directional LSTM Architecture

Authors

  • Menglong Wu
  • Lin Liu
  • Wenxi Yao
  • Chunyong Yin
  • Jin Wang
Abstract

Semantic relation extraction is a meaningful task in NLP that can provide useful information, and semantic relation classification has attracted many researchers. This paper mainly introduces a Bi-directional LSTM (long short-term memory) deep neural network and the parameters of its embedding layer; this network can address the problem of over-fitting. In addition, based on the text of the dataset, we propose a new idea to optimize the input structure and sequence padding.
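The paper does not publish code, but the architecture described in the abstract can be pictured with a minimal sketch. The Keras example below is a hypothetical illustration only: the vocabulary size, sequence length, embedding dimension, LSTM width, dropout rate, and number of relation classes are all assumptions, and dropout is shown as one common way to curb over-fitting rather than the paper's specific remedy.

# Hypothetical sketch: padded token sequences -> embedding -> bi-directional LSTM
# -> softmax over relation classes. All hyper-parameters are assumed, not taken
# from the paper.
from tensorflow.keras import layers, models
from tensorflow.keras.preprocessing.sequence import pad_sequences

VOCAB_SIZE = 20000    # assumed vocabulary size
MAX_LEN = 60          # assumed padded sentence length
NUM_CLASSES = 19      # assumed number of relation labels

model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 100),                 # embedding layer
    layers.Bidirectional(layers.LSTM(128)),            # bi-directional LSTM
    layers.Dropout(0.5),                               # common guard against over-fitting
    layers.Dense(NUM_CLASSES, activation="softmax"),   # relation classifier
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Sequence padding: shorter sentences are post-padded with zeros to MAX_LEN.
batch = pad_sequences([[4, 87, 12, 9], [15, 3]], maxlen=MAX_LEN, padding="post")
print(model.predict(batch).shape)   # (2, NUM_CLASSES)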


Related articles

Syntax Aware LSTM model for Semantic Role Labeling

In the Semantic Role Labeling (SRL) task, the tree-structured dependency relation is rich in syntactic information, but it is not well handled by existing models. In this paper, we propose Syntax Aware Long Short-Term Memory (SA-LSTM). The structure of SA-LSTM changes according to the dependency structure of each sentence, so that SA-LSTM can model the whole tree structure of the dependency relation in an arc...


Multi-Domain Joint Semantic Frame Parsing Using Bi-Directional RNN-LSTM

Sequence-to-sequence deep learning has recently emerged as a new paradigm in supervised learning for spoken language understanding. However, most previous studies explored this framework for building single-domain models for each task, such as slot filling or domain classification, comparing deep-learning-based approaches with conventional ones like conditional random fields. This paper ...


YZU-NLP at EmoInt-2017: Determining Emotion Intensity Using a Bi-directional LSTM-CNN Model

The EmoInt-2017 task aims to determine a continuous numerical value representing the intensity with which an emotion is expressed in a tweet. Compared to classification tasks that identify one among n emotions for a tweet, the present task can provide more fine-grained (real-valued) sentiment analysis. This paper presents a system that uses a bi-directional LSTM-CNN model to complete the competitio...
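As a rough illustration only (the EmoInt-2017 system's actual layer order, sizes, and training details are not given here), a bi-directional LSTM feeding a 1-D convolution and a single real-valued output could be sketched as follows; every hyper-parameter below is an assumption.

# Rough sketch of a bi-directional LSTM followed by a 1-D CNN and a single
# regression output for an intensity score in [0, 1]. All sizes are assumed.
from tensorflow.keras import layers, models

VOCAB_SIZE = 10000   # assumed vocabulary size
MAX_LEN = 50         # assumed tweet length after padding

model = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 100),
    layers.Bidirectional(layers.LSTM(64, return_sequences=True)),  # keep per-token outputs for the CNN
    layers.Conv1D(64, kernel_size=3, activation="relu"),
    layers.GlobalMaxPooling1D(),
    layers.Dense(1, activation="sigmoid"),   # real-valued intensity rather than a class label
])
model.compile(optimizer="adam", loss="mse")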


Deep Learning for Query Semantic Domains Classification

Long Short-Term Memory (LSTM), a type of recurrent neural network, has been widely used for language modeling. One of its applications is speech query domain classification, where LSTM is shown to be more effective than traditional statistical models and feedforward neural networks. Different from speech queries, text queries to search engines are usually shorter and lack correct gram...


Contrastive Learning of Emoji-based Representations for Resource-Poor Languages

The introduction of emojis (or emoticons) in social media platforms has given users an increased potential for expression. We propose a novel method called Classification of Emojis using Siamese Network Architecture (CESNA) to learn emoji-based representations of resource-poor languages by jointly training them with resource-rich languages using a siamese network. The CESNA model consists of tw...
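A siamese arrangement of the kind described, two inputs passed through one shared encoder and compared by a distance, can be sketched generically; the encoder, distance function, and loss below are illustrative assumptions, not the CESNA authors' actual design.

# Generic siamese sketch: two inputs share one encoder, and a distance over the
# two encodings drives a similarity score. The encoder and sizes are assumptions.
import tensorflow as tf
from tensorflow.keras import layers, models

VOCAB_SIZE = 10000   # assumed
MAX_LEN = 40         # assumed

shared_encoder = models.Sequential([
    layers.Embedding(VOCAB_SIZE, 64),
    layers.Bidirectional(layers.LSTM(64)),
])

left = layers.Input(shape=(MAX_LEN,), dtype="int32")
right = layers.Input(shape=(MAX_LEN,), dtype="int32")

encoded_left = shared_encoder(left)      # the same weights encode both branches
encoded_right = shared_encoder(right)

# Absolute difference of the two encodings, a common siamese distance feature.
diff = layers.Lambda(lambda t: tf.abs(t[0] - t[1]))([encoded_left, encoded_right])
similarity = layers.Dense(1, activation="sigmoid")(diff)

siamese = models.Model(inputs=[left, right], outputs=similarity)
siamese.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])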



Publication date: 2017